
    Ew gross! Recognition of Expressions of Disgust by Children With Obsessive-Compulsive Disorder

    There is evidence suggesting that obsessive-compulsive disorder (OCD) in adults may be associated with an impaired ability to recognise the facial expression of disgust (Sprengelmeyer et al., 1997a; Woody, Corcoran, & Tolin, in press). It has been suggested that this impairment begins in childhood, when the recognition of emotional expressions is being learnt (see Sprengelmeyer et al., 1997a). This study compared the recognition of facial affect in children aged around 11 years with a diagnosis of OCD (n = 11), other anxiety disorders (n = 20), and nonclinical children (n = 19), adapting the methodology of Sprengelmeyer et al. Disgust was most commonly misclassified as anger by children in all three groups. However, children with OCD showed no evidence of a recognition deficit for disgust in comparison with either control group. Unexpectedly, children with OCD recognised expressions of surprise more accurately than nonclinical children did. Recognition of disgust, or of any other emotion, was not related to child self-reported anxiety symptoms. Given the differences observed in some studies with adults, future research may benefit from examining older adolescents and young adults to determine when these effects first become noticeable.

    Seeing is knowing? Visual word recognition in non-dyslexic and dyslexic readers: an ERP study

    The aim of the current study was to investigate whether phonological/semantic processing of a word takes place simultaneously with, or following, the early processing of its visual features. Event-related potentials (ERPs) were recorded from 13 dyslexic (four female) and 14 non-dyslexic (six female) native English-speaking young adults in two lexical decision tasks. In Task 1, participants made an orthographic lexical decision, distinguishing frequently used words (W) from pseudohomophones (PH1), focusing on the visual properties of the stimuli. In Task 2, they made a phonological lexical decision to pseudohomophones (PH2) and pseudowords (PW), deciding whether the stimuli sounded like real words, which required non-visual, higher-order (i.e., phonological and semantic) processing. Behavioural performance was poorer, and ERP peak latencies longer, in dyslexics than in controls. Reaction times (RTs) and error counts (reversed for the controls in Task 2) increased across the four conditions for both groups in the order W < PH1 < PH2 < PW. The ERPs were larger in Task 2 than in Task 1, starting at 100 ms (P1) for the controls and from about 220 ms (P2) for the dyslexics. The latency of the N2 peak at left occipito-temporal sites was longer (as was the number of errors) in the PH2 condition than in the PW condition in controls only, indicating phonological/semantic-specific processing at a latency of 250-260 ms. Thus the visual task required less effort than the phonological task, and dyslexics' behavioural performance was poorer and their brain activation delayed compared with controls. The combined behavioural and ERP results of this study indicate that phonological/semantic processing of a word took place about 150 ms after processing of its visual features in controls, and possibly later in dyslexics.

    A gender difference in false recall of negative words: Women DRM more than men

    Gender differences in susceptibility to associative memory illusions in the Deese/Roediger-McDermott (DRM) paradigm were investigated using negative and neutral word lists. Women (n = 50) and men (n = 50) studied 20 lists of 12 words, each list composed of associates of a nonpresented critical lure. Ten lists were associates of negatively valenced lures (e.g., cry, evil) and ten were associates of neutral lures (e.g., chair, slow). When asked to recall the words after each list, women falsely recalled more negative lures than men did, but there was no gender difference in the false recall of neutral lures. These findings suggest that women reflect on associations within negative lists to a greater degree than men and are thereby more likely to generate the negative critical lures.
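    The recall-scoring logic underlying the DRM paradigm can be sketched in a few lines: studied words are associates of a never-presented critical lure, and recalling that lure counts as a false memory. The word list, lure, and function names below are illustrative assumptions, not materials from the study.

    ```python
    # Hypothetical DRM-style list: studied words are associates of a
    # non-presented critical lure; recalling the lure is a false memory.
    study_list = ["sob", "tears", "sad", "weep", "grief"]  # associates of "cry"
    critical_lure = "cry"

    def score_recall(recalled, studied, lure):
        """Return (veridical, false_lure) counts for one recall attempt."""
        veridical = sum(1 for w in recalled if w in studied)
        false_lure = 1 if (lure in recalled and lure not in studied) else 0
        return veridical, false_lure

    # A participant recalls four studied words plus the lure itself:
    v, f = score_recall(["sob", "sad", "cry", "weep", "tears"],
                        study_list, critical_lure)
    # v == 4 veridical items, f == 1 false recall of the critical lure
    ```

    Comparing the per-group totals of the `false_lure` counts for negative versus neutral lists is, in essence, the contrast the study reports.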

    How strongly do word reading times and lexical decision times correlate? Combining data from eye movement corpora and megastudies

    We assess the amount of shared variance between three measures of visual word recognition latency: eye movement latencies, lexical decision times, and naming times. After partialling out the effects of word frequency and word length, two well-documented predictors of word recognition latencies, we find that 7-44% of the variance is uniquely shared between lexical decision times and naming times, depending on the frequency range of the words used. A similar analysis of eye movement latencies shows that the percentage of variance they uniquely share with either lexical decision times or naming times is much lower: 5-17% for gaze durations and lexical decision times in studies with target words presented in neutral sentences, dropping to 0.2% in corpus studies in which eye movements to all words are analysed. Correlations between gaze durations and naming latencies are lower still. These findings suggest that processing times in isolated word processing and in continuous text reading are affected by specific task demands and presentation format, and that lexical decision times and naming times are not very informative in predicting eye movement latencies in text reading once the effects of word frequency and word length are taken into account. The difference between controlled experiments and natural reading suggests that reading strategies and stimulus materials may determine the degree to which the immediacy-of-processing assumption and the eye-mind assumption apply. Fixation times are more likely to exclusively reflect the lexical processing of the currently fixated word in controlled studies with unpredictable target words than in natural reading of sentences or texts.
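    The partialling-out step can be illustrated with the standard first-order partial correlation formula, where the squared partial correlation gives the uniquely shared variance. For brevity this sketch controls for a single covariate (word frequency) rather than both frequency and length, and every data value and variable name below is hypothetical, not taken from the study.

    ```python
    from math import sqrt

    def pearson(x, y):
        """Pearson correlation of two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sx = sqrt(sum((v - mx) ** 2 for v in x))
        sy = sqrt(sum((v - my) ** 2 for v in y))
        return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

    def partial_r(x, y, z):
        """Correlation of x and y after partialling out covariate z."""
        rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
        return (rxy - rxz * ryz) / sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

    # Hypothetical latencies (ms) and log frequencies for six words:
    ld = [520, 560, 610, 585, 640, 700]    # lexical decision RT
    nm = [450, 470, 530, 510, 555, 600]    # naming RT
    freq = [3.2, 2.8, 2.1, 2.4, 1.7, 1.1]  # log word frequency

    r = partial_r(ld, nm, freq)
    shared = r ** 2  # variance uniquely shared once frequency is controlled
    ```

    Squaring the partial correlation rather than the raw correlation is what removes the variance that frequency (and, in the study, length) contributes to both latency measures.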